Fair Principal Component Analysis and Filter Design

Authors

Abstract

We consider Fair Principal Component Analysis (FPCA) and search for a low-dimensional subspace that spans multiple target vectors in a fair manner. FPCA is defined as the non-concave maximization of the worst projected target norm within a given set. The problem arises in filter design in signal processing, and when incorporating fairness into dimensionality reduction schemes. The state-of-the-art approach to FPCA is via semidefinite relaxation (SDR) and involves a polynomial yet computationally expensive optimization. To allow scalability, we propose to address FPCA using naive sub-gradient descent. We analyze the landscape of the underlying optimization in the case of orthogonal targets. We prove that the landscape is benign and that all local minima are globally optimal. Interestingly, the SDR approach leads to sub-optimal solutions in this simple case. Finally, we discuss the equivalence between FPCA and the design of normalized tight frames.
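The max-min objective described in the abstract can be sketched as a projected sub-gradient method: at each step, find the target vector with the smallest projected norm, take an ascent step along the gradient of that single term, and retract the iterate back to the set of orthonormal bases. This is an illustrative sketch, not the authors' exact algorithm; the function name, step-size schedule, and QR retraction are assumptions made for the example.

```python
import numpy as np

def fpca_subgradient(A, k, steps=500, lr=0.1, seed=0):
    """Illustrative projected sub-gradient ascent for
    max over orthonormal U (n x k) of min_i ||U^T a_i||^2.
    A: (m, n) array whose rows are the target vectors.
    Hypothetical helper; hyperparameters are arbitrary choices."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    # Random orthonormal starting basis.
    U, _ = np.linalg.qr(rng.standard_normal((n, k)))
    for t in range(steps):
        norms = np.sum((A @ U) ** 2, axis=1)   # ||U^T a_i||^2 per target
        i = int(np.argmin(norms))              # worst-covered target
        a = A[i][:, None]
        grad = 2.0 * (a @ (a.T @ U))           # sub-gradient of the min term
        U = U + (lr / np.sqrt(t + 1)) * grad   # diminishing ascent step
        U, _ = np.linalg.qr(U)                 # retract to orthonormal columns
    return U
```

For orthogonal targets (e.g. the rows of an identity matrix), the abstract's benign-landscape result suggests such a scheme should spread the projection energy fairly across all targets rather than favoring any one of them.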


Similar articles

Convex Formulations for Fair Principal Component Analysis

Though there is a growing body of literature on fairness for supervised learning, the problem of incorporating fairness into unsupervised learning has been less well-studied. This paper studies fairness in the context of principal component analysis (PCA). We first present a definition of fairness for dimensionality reduction, and our definition can be interpreted as saying that a reduction is ...


Principal Component Projection Without Principal Component Analysis

We show how to efficiently project a vector onto the top principal components of a matrix, without explicitly computing these components. Specifically, we introduce an iterative algorithm that provably computes the projection using few calls to any black-box routine for ridge regression. By avoiding explicit principal component analysis (PCA), our algorithm is the first with no runtime dependen...


Multi-dimensional, paraunitary principal component filter banks

In this paper, the one-dimensional principal component filter banks (PCFB's) derived in [17] are generalized to higher dimensions. As presented in [17], PCFB's minimize the mean-squared error (MSE) when only Q out of P subbands are retained. Previously, 2D PCFB's were proposed in [16]. The work in [16] was limited to 2D signals and separable resampling operators. The formulation presented here is g...


Compression of Breast Cancer Images By Principal Component Analysis

The principle of dimensionality reduction with PCA is the representation of the dataset ‘X’ in terms of eigenvectors ei ∈ RN of its covariance matrix. The eigenvectors oriented in the direction with the maximum variance of X in RN carry the most relevant information of X. These eigenvectors are called principal components [8]. Ass...
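The eigenvector representation described in this excerpt can be sketched in a few lines: center the data, take the top-Q eigenvectors of the sample covariance, and keep only the Q projection coefficients per sample. The function names below are hypothetical, introduced only for this illustration.

```python
import numpy as np

def pca_compress(X, Q):
    """Project rows of X onto the top-Q eigenvectors of the sample
    covariance matrix (illustrative sketch, not the paper's code)."""
    mu = X.mean(axis=0)
    Xc = X - mu
    C = Xc.T @ Xc / (X.shape[0] - 1)   # sample covariance
    w, V = np.linalg.eigh(C)           # eigenvalues in ascending order
    W = V[:, ::-1][:, :Q]              # top-Q principal components
    Z = Xc @ W                         # compressed coefficients
    return Z, W, mu

def pca_reconstruct(Z, W, mu):
    """Map compressed coefficients back to the original space."""
    return Z @ W.T + mu
```

With Q equal to the full dimension the reconstruction is exact; smaller Q trades reconstruction error for compression, discarding the low-variance directions.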




Journal

Journal title: IEEE Transactions on Signal Processing

Year: 2021

ISSN: 1053-587X, 1941-0476

DOI: https://doi.org/10.1109/tsp.2021.3099983